Index converter dynamic cases fix #3694
base: main
Conversation
The index dtype validator doesn't support the bool type:

(Pdb) node.args
(_reshape_copy_597, [eq_1427])
(Pdb) node.args[0].dtype
*** AttributeError: 'Node' object has no attribute 'dtype'
(Pdb) node.args[0].meta['val']
FakeTensor(..., device='cuda:0', size=(s43*s53, 2048), dtype=torch.float16)
(Pdb) node.args[0].meta['val'].dtype
torch.float16
(Pdb) node.args[1][0].meta['val'].dtype
torch.bool

The inputs to index are FP16 and bool. In this case, the converter is not invoked and the op falls back to PyTorch.
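For context, here is a minimal sketch of the kind of dtype check the validator performs over the index arguments. The name index_dtype_validator and the exact accepted dtypes are assumptions, not the code in this PR; the point is that the dtype lives in the fake-tensor meta rather than on the fx Node, and torch.bool has to be allowed explicitly:

```python
import torch
from torch.fx import Node


def index_dtype_validator(node: Node) -> bool:
    # node.args is (input_tensor, [index_0, index_1, ...]); fx Nodes have no
    # .dtype attribute, so the dtype must come from the fake-tensor metadata.
    _, indices = node.args
    for idx in indices:
        if idx is None:
            continue
        val = idx.meta.get("val")
        if val is None:
            return False
        # Allowing torch.bool here is the gist of the fix: without it the
        # validator rejects the node and the op falls back to PyTorch.
        if val.dtype not in (torch.int32, torch.int64, torch.bool):
            return False
    return True
```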
Oh ok, looks like per https://github.com/zdevito/ATen/blob/master/aten/src/ATen/native/native_functions.yaml#L1324, bool index tensors should denote a mask. I think a nonzero layer to get the corresponding indices should work. I will push the change.
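For illustration, a small PyTorch check of that equivalence (a boolean index tensor acts as a mask, and nonzero recovers the integer indices it selects), which is what routing the bool index through a nonzero layer plus a gather would mirror in the TensorRT graph. This is a standalone sketch, not the converter code:

```python
import torch

x = torch.randn(6, 4, dtype=torch.float16)
mask = torch.tensor([True, False, True, False, False, True])

# Boolean indexing treats the index tensor as a mask over dim 0 ...
masked = x[mask]

# ... which is equivalent to gathering at the positions where the mask is
# True, i.e. the indices produced by nonzero().
indices = mask.nonzero().squeeze(1)
gathered = x[indices]

assert torch.equal(masked, gathered)
```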
Force-pushed from e5aae73 to a14fc4c
Force-pushed from 1834500 to 663fbe3
@@ -53,6 +53,65 @@ def select(
    return layer.get_output(0)


def is_boolean_tensor(tensor: Union[TRTTensor, np.ndarray, torch.Tensor]) -> bool:
    if isinstance(tensor, (TRTTensor)):
        val = tensor.meta.get("val")
Can we directly use tensor.dtype == torch.bool?
I hit an issue where the tensor has no meta in the gemma3 model.
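A sketch of what that dtype-based check could look like, covering TRT, torch, and numpy tensors without relying on fx meta being present. The TRTTensor alias for trt.ITensor and the exact branching are assumptions here, not the code in this PR:

```python
from typing import Union

import numpy as np
import tensorrt as trt
import torch

TRTTensor = trt.ITensor


def is_boolean_tensor(tensor: Union[TRTTensor, np.ndarray, torch.Tensor]) -> bool:
    # TRT tensors expose their dtype directly, so no meta lookup is needed.
    if isinstance(tensor, TRTTensor):
        return tensor.dtype == trt.DataType.BOOL
    if isinstance(tensor, torch.Tensor):
        return tensor.dtype == torch.bool
    if isinstance(tensor, np.ndarray):
        return tensor.dtype == np.bool_
    return False
```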
One comment, please take a look; otherwise LGTM.
This PR-